A number of mathematical procedures exist for designing discrete-time compensators. However, the digital implementation of these designs, with a microprocessor for example, has not received nearly as thorough an investigation. Because the digital hardware operates with finite precision, the designer must choose an algorithm (computational structure) that performs well enough with respect to the original design objectives. This paper describes a procedure for estimating the required fixed-point coefficient wordlength of any given computational structure used to implement a single-input single-output LQG design. The estimates are compared with the actual number of bits needed to achieve a specified performance index.
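
The coefficient-quantization effect at issue can be illustrated with a minimal sketch (not the paper's procedure): round a compensator's coefficients to a b-bit fixed-point fraction and observe how the response degrades as the wordlength shrinks. The first-order filter and the wordlengths below are illustrative assumptions, not taken from the paper.

```python
def quantize(x, frac_bits):
    """Round x to the nearest multiple of 2**-frac_bits (fixed-point grid)."""
    scale = 2 ** frac_bits
    return round(x * scale) / scale

def step_response(a, b, n=50):
    """Step response of the first-order recursion y[k] = a*y[k-1] + b*u[k], u[k] = 1."""
    y, out = 0.0, []
    for _ in range(n):
        y = a * y + b
        out.append(y)
    return out

# Ideal (full-precision) coefficients of a hypothetical compensator section.
a, b = 0.967, 0.033
ideal = step_response(a, b)

# Shorter coefficient wordlengths give coarser grids and larger response error.
for bits in (4, 8, 12):
    aq, bq = quantize(a, bits), quantize(b, bits)
    resp = step_response(aq, bq)
    err = max(abs(r - i) for r, i in zip(resp, ideal))
    print(f"{bits:2d} fractional bits: max step-response error = {err:.5f}")
```

The paper's concern is predicting, rather than measuring by trial, how many such bits a given structure needs to meet the design's performance index.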